# Masked language model
## Italian ModernBERT Base
Author: DeepMount00 · License: Apache-2.0 · Tags: Large Language Model, Transformers, Other

Italian ModernBERT is a specialized version of ModernBERT for the Italian language, pre-trained specifically on Italian text.

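Most of the checkpoints in this list are standard masked language models that can be queried with the Transformers fill-mask pipeline. The sketch below shows that pattern; the repository ID `DeepMount00/Italian-ModernBERT-base` is an assumption inferred from the author and model name above, not confirmed by this listing.

```python
# Minimal fill-mask sketch for a masked language model.
# The repository ID below is an assumption inferred from this listing.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="DeepMount00/Italian-ModernBERT-base",  # assumed Hugging Face repo ID
)

# The mask token depends on the tokenizer ([MASK] for BERT/ModernBERT-style
# vocabularies); check fill_mask.tokenizer.mask_token if unsure.
for prediction in fill_mask("La capitale d'Italia è [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```
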
## BARTSmiles
Author: gayane · Tags: Molecular Model, Transformers

BARTSmiles is a generative masked language model based on the BART architecture, designed specifically for molecular representation and SMILES string processing.

## PhayaThaiBERT
Author: clicknext · Tags: Large Language Model, Transformers, Other

PhayaThaiBERT is a foundational BERT model for the Thai language, suitable for Thai text processing tasks.

## GPN-MSA Sapiens
Author: songlab · License: MIT · Tags: Protein Model, Transformers, Other

GPN-MSA is a language model trained on the genomes of humans and 89 other vertebrate species, specializing in DNA sequence analysis and variant effect prediction.

## RuLeanALBERT
Author: yandex · License: Apache-2.0 · Tags: Large Language Model, Transformers, Other

RuLeanALBERT is a memory-efficient masked language model pretrained specifically for Russian.

## RoBERTa Small Belarusian
Author: KoichiYasuoka · Tags: Large Language Model, Transformers, Other

This is a RoBERTa model pretrained on the CC-100 dataset, suitable for Belarusian text processing tasks.

## Takalane Tsn RoBERTa
Author: jannesg · License: MIT · Tags: Large Language Model, Other

This is a masked language model focused on the Tswana language, intended to improve NLP performance for low-resource South African languages.

## AlephBERT Base
Author: biu-nlp · License: Apache-2.0 · Tags: Large Language Model, Transformers, Other

AlephBERT is a state-of-the-art language model for Hebrew, based on Google's BERT architecture and designed specifically for processing Hebrew text.

## VetBERT
Author: havocy28 · License: OpenRAIL · Tags: Large Language Model, Transformers, English

VetBERT is a pretrained language model designed for processing veterinary clinical notes; it is initialized from Bio_ClinicalBERT and further trained on veterinary-domain data.

## RoBERTa Go
Author: dbernsohn · Tags: Large Language Model

This is a RoBERTa model pre-trained on the CodeSearchNet dataset, designed specifically for masked language modeling of Go (Golang) code.

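For code-oriented checkpoints such as this one, masked-token prediction can also be done without the pipeline by scoring the mask position directly. This is a sketch only: the repository ID `dbernsohn/roberta-go` is assumed from the author field above, and the Go snippet is illustrative.

```python
# Sketch: top-k masked-token prediction over a Go snippet.
# "dbernsohn/roberta-go" is an assumed repository ID.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "dbernsohn/roberta-go"  # assumed
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

code = f'fmt.{tokenizer.mask_token}("hello world")'
inputs = tokenizer(code, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the mask position and read off the five most likely tokens.
mask_index = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top_ids = logits[0, mask_index].topk(5).indices[0].tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))
```
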
## Tunbert Zied
Author: ziedsb19 · Tags: Large Language Model, Transformers

tunbert_zied is a language model for the Tunisian dialect with an architecture similar to RoBERTa, trained on more than 600,000 Tunisian-dialect phrases.

## RoBERTa Base Thai Char
Author: KoichiYasuoka · License: Apache-2.0 · Tags: Large Language Model, Transformers, Other

This is a RoBERTa model pre-trained on Thai Wikipedia text, using character-level embeddings so that it can be used with BertTokenizerFast.

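The character-level design mentioned above is easy to verify by inspecting the tokenizer output: each Thai character should become its own token. A small sketch, assuming the checkpoint is available as `KoichiYasuoka/roberta-base-thai-char` (inferred from the author field, not stated in this listing):

```python
# Sketch: inspect the character-level vocabulary of a Thai RoBERTa model.
# The repository ID is an assumption inferred from this listing.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("KoichiYasuoka/roberta-base-thai-char")

# A character-level vocabulary should split the word into single characters.
print(tokenizer.tokenize("ภาษาไทย"))
print(tokenizer.mask_token)  # mask token to use in fill-mask queries
```
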
## RoBERTa Small Japanese Aozora
Author: KoichiYasuoka · Tags: Large Language Model, Transformers, Japanese

A small Japanese RoBERTa model pre-trained on Aozora Bunko texts, suitable for a variety of downstream NLP tasks.

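"Downstream NLP tasks" here usually means fine-tuning the encoder or using its hidden states as features. A minimal feature-extraction sketch, assuming the checkpoint is published as `KoichiYasuoka/roberta-small-japanese-aozora` (an inferred repository ID):

```python
# Sketch: use a masked-language-model encoder as a feature extractor.
# The repository ID below is an assumption inferred from this listing.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "KoichiYasuoka/roberta-small-japanese-aozora"  # assumed
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer("吾輩は猫である。", return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (batch, tokens, hidden_size)

# Mean-pool over tokens to get a single sentence vector for downstream use.
sentence_vector = hidden.mean(dim=1)
print(sentence_vector.shape)
```
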
## BERT Large Japanese Char Extended
Author: KoichiYasuoka · Tags: Large Language Model, Transformers, Japanese

This is a BERT model pre-trained on Japanese Wikipedia text, derived from bert-large-japanese-char, with an extended character embedding layer that supports additional kanji characters.

## RoBERTa Small Japanese Aozora Char
Author: KoichiYasuoka · Tags: Large Language Model, Transformers, Japanese

A RoBERTa model pretrained on Aozora Bunko texts with a character tokenizer, suitable for Japanese text processing tasks.